Learning from Between-Class Examples for Deep Sound Recognition (ICLR 2018)

Authors

  • Yuji Tokozume
  • Yoshitaka Ushiku
  • Tatsuya Harada
Abstract

Deep learning methods have achieved high performance in sound recognition tasks. Deciding how to feed the training data is important for further performance improvement. We propose a novel learning method for deep sound recognition: Between-Class learning (BC learning). Our strategy is to learn a discriminative feature space by recognizing the between-class sounds as between-class sounds. We generate between-class sounds by mixing two sounds belonging to different classes with a random ratio. We then input the mixed sound to the model and train the model to output the mixing ratio. The advantages of BC learning are not limited to the increase in variation of the training data; BC learning leads to an enlargement of Fisher’s criterion in the feature space and a regularization of the positional relationship among the feature distributions of the classes. The experimental results show that BC learning improves the performance on various sound recognition networks, datasets, and data augmentation schemes, in which BC learning proves to be always beneficial. Furthermore, we construct a new deep sound recognition network (EnvNet-v2) and train it with BC learning. As a result, we achieved a performance that surpasses the human level.
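The mixing step described in the abstract is simple enough to sketch. Below is a minimal, illustrative Python sketch, not the authors' implementation: it assumes two raw waveforms x1 and x2 of equal length from different classes, with one-hot labels t1 and t2, and uses a plain linear mix, whereas the paper describes a more careful mixing scheme that also accounts for the sound-pressure levels of the two sounds.

    import numpy as np

    def bc_mix(x1, t1, x2, t2, rng=np.random):
        """Mix two sounds from different classes with a random ratio r and
        return the between-class sound together with the ratio as a soft label."""
        r = rng.uniform(0.0, 1.0)            # random mixing ratio
        x_mix = r * x1 + (1.0 - r) * x2      # between-class sound (simple linear mix)
        t_mix = r * t1 + (1.0 - r) * t2      # soft label encoding the mixing ratio
        return x_mix.astype(np.float32), t_mix.astype(np.float32)

The model is then trained to predict t_mix, for example with a KL-divergence loss between its softmax output and the soft label; the exact mixing and loss formulation follow the paper rather than this sketch.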


Similar Articles

Predicting functional sites in disordered proteins - implications in disease

Proteins that exist without a stable, well-defined 3D structure in their isolated form (Intrinsically Disordered/Unstructured Proteins – IDPs/IUPs) are known to play crucial roles in biological systems. IDPs are often involved in molecular recognition processes via their disordered binding regions that can recognize partner molecules by undergoing a coupled folding and bindin...


Deep Dictionary Learning: Synergizing Reconstruction and Classification

Deep dictionary learning seeks multiple dictionaries at different image scales to capture complementary coherent characteristics. We propose a method for learning a hierarchy of synthesis dictionaries with an image classification goal. The dictionaries and classification parameters are trained by a classification objective, and the sparse features are extracted by reducing a reconstruction loss...


ICLR 2018: Attention-Based Guided Structured Sparsity of Deep Neural Networks

Network pruning is aimed at imposing sparsity in a neural network architecture by increasing the portion of zero-valued weights, reducing the network's size for energy efficiency and increasing evaluation speed. In most of the conducted research efforts, the sparsity is enforced for network pruning without any attention to internal network characteristics such as unbalanced out...


Amitraz Poisoning; A case study

Amitraz, an insecticide/acaricide of the formamidine pesticides group, is an α2-adrenergic agonist of the amidine chemical family, generally used to control animal ectoparasites. Poisoning due to amitraz is rare and character...


ICLR 2018: Few-Shot Learning with Graph Neural Networks

We propose to study the problem of few-shot learning with the prism of inference on a partially observed graphical model, constructed from a collection of input images whose label can be either observed or not. By assimilating generic message-passing inference algorithms with their neural-network counterparts, we define a graph neural network architecture that generalizes several of the recentl...



Journal:

Volume   Issue

Pages  -

Publication date: 2018